Boltzmann machines as two-dimensional tensor networks
Authors
Abstract
Restricted Boltzmann machines (RBM) and deep Boltzmann machines (DBM) are important models in machine learning, and have recently found numerous applications in quantum many-body physics. We show that there are fundamental connections between them and tensor networks. In particular, we demonstrate that any RBM and DBM can be exactly represented as a two-dimensional tensor network. This representation gives an understanding of their expressive power using the entanglement structures of the tensor networks, and also provides an efficient tensor network contraction algorithm for computing the partition function of a DBM. Using numerical experiments, we show that the proposed algorithm is much more accurate than state-of-the-art machine learning methods in estimating the partition function of restricted Boltzmann machines, and has potential for training general Boltzmann machines for machine learning tasks.
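The exact mapping stated in the abstract can be made concrete on a toy RBM: the partition function Z = sum_{v,h} exp(a·v + b·h + v·W·h) is a contraction of one bias vector per unit and one 2x2 Boltzmann matrix per visible-hidden edge, with repeated indices playing the role of copy tensors. The sketch below is only an illustration of that idea; the model sizes, random parameters, and index layout are assumptions for this example, not the paper's construction or code.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
nv, nh = 3, 2                                   # tiny RBM: 3 visible, 2 hidden units
W = rng.normal(scale=0.5, size=(nv, nh))        # couplings
a = rng.normal(scale=0.5, size=nv)              # visible biases
b = rng.normal(scale=0.5, size=nh)              # hidden biases

# Brute-force partition function: sum exp(a.v + b.h + v.W.h) over all {0,1} configurations.
Z_brute = sum(
    np.exp(a @ np.array(v) + b @ np.array(h) + np.array(v) @ W @ np.array(h))
    for v in itertools.product([0, 1], repeat=nv)
    for h in itertools.product([0, 1], repeat=nh)
)

# Tensor-network view: one bias vector per unit and one 2x2 Boltzmann matrix per edge,
# E[(i, j)][v, h] = exp(W_ij * v * h); index reuse in einsum acts as the copy tensors.
A  = [np.exp(a[i] * np.array([0, 1])) for i in range(nv)]
Bh = [np.exp(b[j] * np.array([0, 1])) for j in range(nh)]
E  = {(i, j): np.exp(W[i, j] * np.outer([0, 1], [0, 1]))
      for i in range(nv) for j in range(nh)}

Z_tn = np.einsum(
    'a,b,c,x,y,ax,ay,bx,by,cx,cy->',
    A[0], A[1], A[2], Bh[0], Bh[1],
    E[(0, 0)], E[(0, 1)], E[(1, 0)], E[(1, 1)], E[(2, 0)], E[(2, 1)],
)

print(Z_brute, Z_tn)  # the two values agree up to floating-point error
```

On a toy model the full contraction is trivial; the point of the paper's two-dimensional representation is that for large machines the same contraction can be carried out approximately with standard tensor network algorithms instead of enumerating configurations.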
Similar resources
Tensor-Variate Restricted Boltzmann Machines
Restricted Boltzmann Machines (RBMs) are an important class of latent variable models for representing vector data. An under-explored area is multimode data, where each data point is a matrix or a tensor. Applying standard RBMs to such data would require vectorizing matrices and tensors, resulting in unnecessarily high dimensionality and, at the same time, destroying the inherent higher-ord...
From Hopfield networks to Boltzmann machines
From Hopfield networks to Boltzmann machines 17.1 The capacity of the Hopfield network We will first explore the information storage capabilities of a binary Hopfield network which learns using the Hebb rule by considering the stability of just one bit of one of the desired patterns, assuming that the state of the network is set to that desired pattern x^(n). We will assume that the patterns to be s...
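The stability argument sketched in that excerpt can be spelled out in a few lines: store random +/-1 patterns with the Hebb rule and check whether the local field at each neuron has the same sign as the stored bit. The sizes and random patterns below are arbitrary choices for illustration, not taken from the source.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 20                                   # neurons, stored patterns
X = rng.choice([-1, 1], size=(P, N))             # patterns x^(n)

W = (X.T @ X) / N                                # Hebb rule
np.fill_diagonal(W, 0.0)                         # no self-connections

fields = X @ W                                   # local field h_i = sum_j W_ij x_j^(n)
stable_fraction = np.mean(np.sign(fields) == X)  # fraction of pattern bits that are stable
print(f"fraction of stable bits: {stable_fraction:.3f}")
```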
Neural Networks on GPUs: Restricted Boltzmann Machines
Despite the popularity and success of neural networks in research, the number of resulting commercial or industrial applications has been limited. A primary cause of this lack of adoption is the fact that neural networks are usually implemented as software running on general-purpose processors. In this paper, we investigate how GPUs can be used to take advantage of the inherent parallel...
Deep Boltzmann Machines as Feed-Forward Hierarchies
The deep Boltzmann machine is a powerful model that extracts the hierarchical structure of observed data. While inference is typically slow due to its undirected nature, we argue that the emerging feature hierarchy is still explicit enough to be traversed in a feedforward fashion. The claim is corroborated by training a set of deep neural networks on real data and measuring the evolution of the...
Boltzmann Machines
Up till this point we have been studying associative networks that operate in a deterministic way. That is, given particular interconnections between units, and a particular set of initial conditions, the networks would always exhibit the same dynamical behaviour (going downhill in energy), and hence always end up in the same stable state. This feature of their operation was due to the fact tha...
Journal
Journal title: Physical Review B
Year: 2021
ISSN: 0556-2813, 1538-4497, 1089-490X
DOI: https://doi.org/10.1103/physrevb.104.075154